Restricted Strong Convexity Implies Weak Submodularity

Authors

  • Ethan R. Elenberg
  • Rajiv Khanna
  • Alexandros G. Dimakis
  • Sahand Negahban
Abstract

We connect high-dimensional subset selection and submodular maximization. Our results extend the work of Das and Kempe (2011) from the setting of linear regression to arbitrary objective functions. For greedy feature selection, this connection allows us to obtain strong multiplicative performance bounds on several methods without statistical modeling assumptions. We also derive recovery guarantees of this form under standard assumptions. Our work shows that greedy algorithms perform within a constant factor of the best possible subset-selection solution for a broad class of general objective functions. Our methods allow direct control over the number of selected features, as opposed to regularization parameters that only implicitly control sparsity. Our proof technique uses the concept of weak submodularity initially defined by Das and Kempe. We draw a connection between convex analysis and submodular set function theory which may be of independent interest for other statistical learning applications that have combinatorial structure.
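The greedy feature selection analyzed in the abstract can be sketched as plain forward selection: at each step, add the element with the largest marginal gain. For a γ-weakly submodular monotone objective, this achieves a (1 − e^(−γ)) multiplicative guarantee (Das and Kempe, 2011). The sketch below is illustrative, not the paper's implementation; the names `greedy_maximize` and `coverage` and the toy weighted-coverage objective are assumptions for demonstration.

```python
def greedy_maximize(f, ground_set, k):
    """Greedy forward selection: repeatedly add the element with the
    largest marginal gain f(S ∪ {e}) - f(S).  For a γ-weakly submodular
    monotone f this is a (1 - e^{-γ})-approximation to the best size-k set."""
    S = set()
    for _ in range(k):
        best, best_gain = None, float("-inf")
        for e in ground_set - S:
            gain = f(S | {e}) - f(S)
            if gain > best_gain:
                best, best_gain = e, gain
        S.add(best)
    return S

# Toy monotone submodular objective: weighted coverage (hypothetical data).
universe_weights = {1: 3.0, 2: 1.0, 3: 2.0, 4: 1.5}
sets = {"a": {1, 2}, "b": {2, 3}, "c": {3, 4}, "d": {1, 4}}

def coverage(S):
    covered = set().union(*(sets[s] for s in S)) if S else set()
    return sum(universe_weights[u] for u in covered)

picked = greedy_maximize(coverage, set(sets), k=2)
```

For fully submodular objectives γ = 1 and the bound recovers the classical 1 − 1/e factor; the paper's contribution is showing that restricted strong convexity of a general objective lower-bounds γ, so the same style of guarantee applies beyond linear regression.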


Similar works

Several Aspects of Antimatroids and Convex Geometries Master's Thesis

Convexity is important in several fields, and a number of theories address it. In this thesis, we discuss a kind of combinatorial convexity, in particular antimatroids and convex geometries. An antimatroid is a combinatorial abstraction of convexity. It has several different origins: by Dilworth in lattice theory, by Edelman and Jamison in the notions of convexity, and by Korte and Lovász, who were motivated by...


Polyhedral aspects of Submodularity, Convexity and Concavity

The seminal work by Edmonds [9] and Lovász [39] shows the strong connection between submodular functions and convex functions. Submodular functions have tight modular lower bounds and a subdifferential structure [16] in a manner akin to convex functions. They also admit polynomial-time algorithms for minimization and satisfy the Fenchel duality theorem [18] and the Discrete Separation Theorem ...
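The submodularity underlying these results is the diminishing-returns property: the marginal gain of an element can only shrink as the set grows. A brute-force check of this property on a small ground set can be sketched as follows (the helper `is_submodular` and the coverage example are illustrative assumptions, not from the cited works):

```python
from itertools import combinations

def is_submodular(f, ground):
    """Brute-force diminishing-returns check:
    f(S ∪ {e}) - f(S) >= f(T ∪ {e}) - f(T) for all S ⊆ T and e ∉ T."""
    subsets = [frozenset(c) for r in range(len(ground) + 1)
               for c in combinations(ground, r)]
    for S in subsets:
        for T in subsets:
            if S <= T:
                for e in ground - T:
                    if f(S | {e}) - f(S) < f(T | {e}) - f(T) - 1e-9:
                        return False
    return True

# Coverage is the canonical submodular example (toy instance).
cover = {"a": {1, 2}, "b": {2, 3}, "c": {1, 3}}
def num_covered(S):
    return len(set().union(*(cover[x] for x in S))) if S else 0

ok = is_submodular(num_covered, set(cover))
```

Exponential enumeration like this is only feasible for tiny ground sets; the polynomial-time minimization results cited above rely on the Lovász extension rather than enumeration.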


Linear Convergence of the Randomized Feasible Descent Method Under the Weak Strong Convexity Assumption

In this paper we generalize the framework of the feasible descent method (FDM) to a randomized (R-FDM) and a coordinate-wise random feasible descent method (RC-FDM) framework. We show that the famous SDCA algorithm for optimizing the SVM dual problem, or the stochastic coordinate descent method for the LASSO problem, fits into the framework of RC-FDM. We prove linear convergence for both R-FDM ...


Characterizations of Convex Sets by Local Support Properties

It is our purpose to establish some new characterizations of convex sets by means of local properties and to derive as a consequence certain known results. This will be done for sets in a topological linear space T, such a space being a real linear space with a Hausdorff topology such that the operations of vector addition x+y and scalar multiplication ax are continuous in both variables jointl...


Strong Displacement Convexity on Riemannian Manifolds

Ricci curvature bounds in Riemannian geometry are known to be equivalent to the weak convexity (convexity along at least one geodesic between any two points) of certain functionals in the space of probability measures. We prove that the weak convexity can be reinforced into strong (usual) convexity, thus solving a question left open in [4].



Journal:
  • CoRR

Volume: abs/1612.00804

Publication date: 2016